
    Least-change bidirectional model transformation with QVT-R and ATL

    QVT Relations (QVT-R) is the standard language proposed by the OMG to specify bidirectional model transformations. Unfortunately, in part due to ambiguities and omissions in the original semantics, acceptance and development of effective tool support have been slow. Recently, the checking semantics of QVT-R has been clarified and formalized. In this article we propose a QVT-R tool that complies with this semantics. Unlike any other existing tool, it also supports metamodels enriched with OCL constraints (thus avoiding returning ill-formed models), and proposes an alternative enforcement semantics that works according to the simple and predictable “principle of least change”. The implementation is based on an embedding of both QVT-R transformations and UML class diagrams (annotated with OCL) in Alloy, a lightweight formal specification language with support for automatic model finding via SAT solving. We also show how this technique can be applied to bidirectionalize ATL, a popular (but unidirectional) model transformation language. This work is funded by ERDF-European Regional Development Fund through the COMPETE Programme (operational programme for competitiveness) and by national funds through the FCT-Fundação para a Ciência e a Tecnologia (Portuguese Foundation for Science and Technology) within project FCOMP-01-0124-FEDER-020532. The first author is also sponsored by FCT grant SFRH/BD/69585/2010. The authors would also like to thank all anonymous reviewers for their valuable comments and suggestions.
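    As a rough sketch (not the paper's actual Alloy encoding; all names below are hypothetical), backward enforcement under the principle of least change can be pictured as: among all candidate sources consistent with the updated target, pick the one closest to the original source. The tool delegates this search to SAT solving via Alloy; the Java sketch below does it by plain enumeration.

    import java.util.Comparator;
    import java.util.List;
    import java.util.Optional;
    import java.util.function.BiPredicate;
    import java.util.function.ToIntBiFunction;

    // Illustrative only: `consistent` stands in for the QVT-R checking relation
    // R(s, t) and `distance` for a model edit distance; both are assumptions.
    final class LeastChangeEnforcer<S, T> {
        private final BiPredicate<S, T> consistent;
        private final ToIntBiFunction<S, S> distance;

        LeastChangeEnforcer(BiPredicate<S, T> consistent, ToIntBiFunction<S, S> distance) {
            this.consistent = consistent;
            this.distance = distance;
        }

        // Propagate a target update back to the source with minimal change.
        Optional<S> backward(S oldSource, T newTarget, List<S> candidateSources) {
            return candidateSources.stream()
                    .filter(s -> consistent.test(s, newTarget))
                    .min(Comparator.comparingInt(s -> distance.applyAsInt(oldSource, s)));
        }
    }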

    Formal Verification of Security Protocol Implementations: A Survey

    Automated formal verification of security protocols has mostly focused on analyzing high-level abstract models which, however, differ significantly from real protocol implementations written in programming languages. Recently, some researchers have started investigating techniques that bring automated formal proofs closer to real implementations. This paper surveys these attempts, focusing on approaches that target the application code that implements the protocol logic, rather than the libraries that implement cryptography. In these approaches, the libraries are assumed to correctly implement given abstract models; the aim is to derive formal proofs that, under this assumption, give assurance about the application code that implements the protocol logic. The two main approaches, model extraction and code generation, are presented, along with the main techniques adopted for each approach.
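    Purely as an illustration of the split these approaches rely on (all names below are hypothetical), the protocol logic is ordinary application code calling a cryptographic API; only that logic is related to the formal model, while the library behind the interface is assumed to behave like its idealized model.

    import java.util.Arrays;

    // Hypothetical crypto interface; the surveyed approaches assume its
    // implementation matches an ideal model (e.g., authenticated encryption).
    interface CryptoLib {
        byte[] encrypt(byte[] key, byte[] nonce, byte[] plaintext);
        byte[] decrypt(byte[] key, byte[] nonce, byte[] ciphertext); // null if invalid
    }

    // Application code implementing the protocol logic: the part that model
    // extraction or code generation tries to connect to a formal proof.
    final class EchoResponder {
        private final CryptoLib crypto;
        private final byte[] sessionKey;

        EchoResponder(CryptoLib crypto, byte[] sessionKey) {
            this.crypto = crypto;
            this.sessionKey = sessionKey.clone();
        }

        byte[] handle(byte[] nonce, byte[] ciphertext) {
            byte[] msg = crypto.decrypt(sessionKey, nonce, ciphertext);
            if (msg == null) {
                return null; // reject messages that fail authentication
            }
            byte[] reply = Arrays.copyOf(msg, msg.length);
            return crypto.encrypt(sessionKey, nonce, reply);
        }
    }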

    Using theorem provers to increase the precision of dependence analysis for information flow control

    Information flow control (IFC) is a category of techniques for enforcing information flow properties. In this paper we present the Combined Approach, a novel IFC technique that combines a scalable system-dependence-graph-based (SDG-based) approach with a precise logic-based approach built on a theorem prover. The Combined Approach is more precise than the SDG-based approach on its own, without sacrificing its scalability. For every potential illegal information flow reported by the SDG-based approach, the Combined Approach automatically generates proof obligations that, if valid, prove that there is no program path along which the reported information flow can happen. These proof obligations are then relayed to the logic-based approach. We also show how the SDG-based approach can provide additional information to the theorem prover that helps decrease the verification effort. Moreover, we present a prototypical implementation of the Combined Approach that uses the tools JOANA and KeY as the SDG-based and logic-based approaches, respectively.
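    To make the idea concrete, here is a small, purely hypothetical example of the kind of false positive the Combined Approach targets: a dependence-graph analysis reports a possible flow from the secret to the returned public value, but the guarding branch is infeasible, so the generated proof obligation (no path realizes the flow) is valid and can be discharged by the theorem prover.

    // Hypothetical false positive for an SDG-based analysis.
    final class FalsePositiveExample {
        static int process(int secret, int publicIn) {
            int publicOut = publicIn;
            int r = publicIn % 2;        // r is always in {-1, 0, 1}
            if (r > 1) {                 // infeasible path condition
                publicOut = secret;      // SDG reports a flow secret -> publicOut
            }
            return publicOut;            // provably independent of `secret`
        }
    }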

    Enhancing Symbolic Execution of Heap-based Programs with Separation Logic for Test Input Generation

    Symbolic execution is a well-established method for test input generation. Despite having achieved tremendous success over numerical domains, existing symbolic execution techniques for heap-based programs are limited by the lack of a succinct and precise description of symbolic values over unbounded heaps. In this work, we present a new symbolic execution method for heap-based programs based on separation logic. The essence of our proposal is context-sensitive lazy initialization, a novel approach for efficient test input generation. Our approach differs from existing approaches in two ways. Firstly, it is based on separation logic, which allows us to precisely capture preconditions of heap-based programs and thus avoid generating invalid test inputs. Secondly, we generate only fully initialized test inputs, which are more useful in practice than the partially initialized test inputs generated by state-of-the-art tools. We have implemented our approach as a tool, called Java StarFinder, and evaluated it on a set of programs with complex heap inputs. The results show that our approach significantly reduces the number of invalid test inputs and improves test coverage.
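    As a rough illustration of the kind of inputs at stake (this is not Java StarFinder's actual notation or API; everything below is hypothetical), consider a list-summing method whose precondition is stated in separation-logic style; a generator that honors it only builds valid, fully initialized heaps rather than partially initialized ones.

    // Hypothetical method under test: the precondition, written informally in
    // separation-logic style, requires `head` to point to an acyclic list.
    final class Node {
        int value;
        Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    final class ListOps {
        // requires: list(head)  where  list(x) ::= x = null  \/  x |-> (v, n) * list(n)
        static int sum(Node head) {
            int total = 0;
            for (Node n = head; n != null; n = n.next) {
                total += n.value;
            }
            return total;
        }
    }

    final class GeneratedTestSketch {
        public static void main(String[] args) {
            // A fully initialized, precondition-respecting input: 3 -> 5 -> null.
            Node input = new Node(3, new Node(5, null));
            System.out.println(ListOps.sum(input)); // prints 8
        }
    }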

    Integration of Static and Dynamic Analysis Techniques for Checking Noninterference

    In this article, we present an overview of recent combinations of deductive program verification and automatic test generation on the one hand and static analysis on the other hand, with the goal of checking noninterference. Noninterference is the non-functional property that certain confidential information cannot leak to certain public output, i.e., the confidentiality of that information is always preserved. We define the noninterference properties that are checked, along with the individual approaches that we use in different combinations. In one use case, our framework for checking noninterference employs deductive verification to automatically generate tests for noninterference violations with improved test coverage. In another use case, the framework provides two combinations of deductive verification with static analysis based on system dependence graphs to prove noninterference, thereby reducing the effort needed for deductive verification.
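    As a purely illustrative example (not taken from the article), noninterference for a method with a secret (high) and a public (low) input requires that two runs agreeing on the low input produce the same low output; a generated test can witness a violation by exhibiting two such runs that disagree.

    // Hypothetical leaky method and a test-style witness of the violation.
    final class NoninterferenceDemo {
        // The public result depends on the secret via the branch: a leak.
        static int publicResult(int secret, int publicIn) {
            return (secret > 0) ? publicIn + 1 : publicIn;
        }

        public static void main(String[] args) {
            int low = 42;
            int run1 = publicResult(+7, low);  // secret = +7
            int run2 = publicResult(-7, low);  // secret = -7
            // Same low input, different low output: noninterference violated.
            System.out.println(run1 != run2);  // prints true
        }
    }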

    Concluding Remarks
